8 - Pattern Recognition (PR)

[MUSIC]

The following content has been provided by the University of Erlangen-Nürnberg.

So, let's just continue where we stopped yesterday. The big picture is still clear; yesterday I drew the big picture on the blackboard. Today we will talk about discriminant analysis,

and basically we are back to the key equation of pattern recognition. It is a very important relationship involving the posterior probabilities, which matter because the Bayesian classifier makes use of the posteriors for the decision process. And we rewrote the posterior using the priors and the class-conditionals, both in the numerator and in the denominator.
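To recap that relationship, here is a sketch in standard notation (the symbols $x$ for the feature vector and $y$ for the class label are my assumption, not necessarily the notation from the blackboard):

$$p(y \mid x) \;=\; \frac{p(x \mid y)\,p(y)}{p(x)} \;=\; \frac{p(x \mid y)\,p(y)}{\sum_{y'} p(x \mid y')\,p(y')},$$

and the Bayesian classifier decides for $\hat{y} = \operatorname{argmax}_y\, p(y \mid x)$.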

What we will now consider are the following three things.

First of all, we know about the Gaussian classifier. The Gaussian classifier uses a class-conditional that is a normal distribution, a normal PDF, basically. And we know about the decision boundary of the Gaussian classifier and its geometry: we have a degree-two polynomial that defines the decision boundary.
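To make the degree-two claim concrete, a sketch in the same assumed notation: with Gaussian class-conditionals $p(x \mid y) = \mathcal{N}(x;\, \mu_y, \Sigma_y)$, the log-posterior is, up to a term independent of $y$,

$$\log p(y \mid x) \;=\; -\tfrac{1}{2}(x - \mu_y)^{\top}\Sigma_y^{-1}(x - \mu_y) \;-\; \tfrac{1}{2}\log\lvert\Sigma_y\rvert \;+\; \log p(y) \;+\; \text{const},$$

which is quadratic in $x$, so equating the expressions for two classes yields a degree-two polynomial as the decision boundary.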

We also know the nice side effect, and that is what I showed yesterday on the blackboard: if the features are normally distributed and the Gaussian covariance matrix is the identity matrix, we end up with a nearest neighbor classifier. So it might be a good idea to say, okay, let's find a feature transform that maps the features into a different space where the features are normally distributed with an identity covariance matrix. That is one thing we will consider today.
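In that special case the quadratic decision function collapses in a simple way: with $\Sigma_y = I$ (and, say, equal priors), the Bayesian decision rule reduces to

$$\hat{y} \;=\; \operatorname{argmin}_y\, \lVert x - \mu_y \rVert^2,$$

that is, assign $x$ to the class with the nearest mean, which is the nearest neighbor behavior mentioned here, with the class means acting as the stored prototypes.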

The other thing: we have seen, a few lectures ago, that if we have a decision boundary that is a degree-n polynomial, we can transform the features into a higher-dimensional space where we end up with a linear decision boundary. Remember this trick, where instead of the feature vector we considered a feature vector that carries the monomials x1 to the power of two, x2 to the power of two, and so on; so we went from a two-dimensional space into a five-, six-, or seven-dimensional space and ended up with a linear decision boundary. Okay, that was the second thing.
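A minimal Python sketch of that trick (the function name and the concrete 2-D quadratic map are my illustration, not code from the lecture):

```python
import numpy as np

def quadratic_feature_map(X):
    """Map 2-D features (x1, x2) to the 5-D monomial space
    (x1, x2, x1^2, x2^2, x1*x2), where a degree-2 decision
    boundary in the original space becomes a hyperplane."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([x1, x2, x1**2, x2**2, x1 * x2])

# Example: the circle x1^2 + x2^2 = 1 separates these two points;
# in the new space that boundary is the linear equation z3 + z4 = 1.
X = np.array([[0.5, 0.5], [2.0, 0.0]])
Z = quadratic_feature_map(X)
print(Z[:, 2] + Z[:, 3])  # [0.5, 4.0]: one side below 1, one above
```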

And the third thing, which we will also consider in detail: if we are in high-dimensional feature spaces, are there any subspaces where the classification problem can already be solved sufficiently well with lower-dimensional features? That is also something we want to consider today in terms of the geometry. So it is feature transformations that we want to look at.
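As a hedged illustration of that idea, here is a sketch using scikit-learn's linear discriminant analysis as a stand-in for the theory the lecture develops; the synthetic data and all parameter choices are made up:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Two Gaussian classes in 10-D that differ mainly along one axis.
X0 = rng.normal(0.0, 1.0, size=(100, 10))
X1 = rng.normal(0.0, 1.0, size=(100, 10))
X1[:, 0] += 3.0
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

# Project onto a 1-D discriminant subspace; the classification
# problem is essentially solved with that single feature.
lda = LinearDiscriminantAnalysis(n_components=1)
X_low = lda.fit_transform(X, y)  # shape (200, 1)
print(X_low.shape, lda.score(X, y))
```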

So what we do is the following: we will consider three things. One thing is, first... that is the German, the European, one; the British, the American one is this here. So Obama says, "I am number one," and he writes this, and to us that means he is number seven, right? So, number one, Obama is a topic today, very interesting. What is going to... okay, any guess? Any guess? That's the European view; if you go to the US... yes? Okay, good.

So, the first thing, what did I want to say: we look for a feature transform that transforms the feature x into another feature, let's say x', which can be higher-dimensional, lower-dimensional, or of the same dimension, and x' is required to fulfill a certain PDF. So we will try to find transformations of the features such that we end up with features that have a certain probability density.
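One classical transform of exactly this kind is whitening: map the features so that they come out with zero mean and identity covariance. A minimal numpy sketch, assuming the features are approximately Gaussian (the helper name is mine):

```python
import numpy as np

def whiten(X):
    """Transform features to zero mean and identity covariance:
    x' = Sigma^{-1/2} (x - mu), with Sigma estimated from X."""
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    # Symmetric inverse square root via eigendecomposition.
    eigvals, eigvecs = np.linalg.eigh(Sigma)
    W = eigvecs @ np.diag(eigvals ** -0.5) @ eigvecs.T
    return (X - mu) @ W.T

rng = np.random.default_rng(1)
X = rng.multivariate_normal([1.0, -2.0], [[4.0, 1.5], [1.5, 1.0]], size=5000)
Xp = whiten(X)
print(np.cov(Xp, rowvar=False).round(2))  # close to the identity
```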

Part of a video series
Accessible via: open access
Duration: 01:26:02
Recording date: 2012-11-06
Uploaded: 2012-12-04 09:09:28
Language: en-US
